Collective Detection of Potentially Harmful Requests Directed at Web Sites

Author

  • Marek Zachara
Abstract

The number of web-based activities and websites is growing every day. Unfortunately, so is cyber-crime. New vulnerabilities are reported daily, and the number of automated attacks is constantly rising. Typical signature-based methods rely on expert knowledge and on distributing updated information to clients (e.g. anti-virus software), which requires considerable effort to keep systems up to date. At the same time, they do not protect against the newest (e.g. zero-day) threats. In this article, a new method is proposed, whereby cooperating systems analyze incoming requests, identify potential threats and present them to other peers. Each host can then utilize the findings of the other peers to identify harmful requests, making the whole system of cooperating servers “remember” and share information about the threats.
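The exchange of threat findings between peers can be illustrated with a minimal sketch. All names here (`CooperatingHost`, the SHA-256 signature scheme, the set-based threat stores) are assumptions chosen for illustration, not the paper's actual algorithm: the idea is only that a host which locally identifies a harmful request shares a compact signature of it, and every peer then checks incoming requests against both its own findings and those received from others.

```python
import hashlib

class CooperatingHost:
    """Illustrative server that shares detected threat signatures with peers.

    The signature scheme (hashing a normalized request path) is a stand-in;
    the paper's detection and exchange mechanisms may differ.
    """

    def __init__(self):
        self.local_threats = set()   # signatures this host identified itself
        self.peer_threats = set()    # signatures reported by cooperating peers

    @staticmethod
    def signature(request_path: str) -> str:
        # Normalize and hash the request so hosts can exchange compact
        # identifiers instead of raw (possibly sensitive) request data.
        normalized = request_path.strip().lower()
        return hashlib.sha256(normalized.encode()).hexdigest()

    def report_threat(self, request_path: str) -> str:
        # Record a locally detected harmful request; the returned signature
        # can then be broadcast to the other peers.
        sig = self.signature(request_path)
        self.local_threats.add(sig)
        return sig

    def receive_from_peer(self, sig: str) -> None:
        # Store a signature another host has flagged as harmful.
        self.peer_threats.add(sig)

    def is_harmful(self, request_path: str) -> bool:
        # A request is flagged if this host or any peer has seen it as harmful.
        sig = self.signature(request_path)
        return sig in self.local_threats or sig in self.peer_threats


# Two cooperating hosts: host_a detects an attack and shares it with host_b,
# which can then block the same request without having seen it before.
host_a = CooperatingHost()
host_b = CooperatingHost()

sig = host_a.report_threat("/admin.php?cmd=../../etc/passwd")
host_b.receive_from_peer(sig)

print(host_b.is_harmful("/admin.php?cmd=../../etc/passwd"))  # True
print(host_b.is_harmful("/index.html"))                      # False
```

This is how the system "remembers" threats collectively: once any peer has identified a request as harmful, every cooperating server can reject it without re-deriving the detection.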


Related articles

Anomaly-based Web Attack Detection: The Application of Deep Neural Network Seq2Seq With Attention Mechanism

Today, the use of the Internet has become an integral part of people's lives, and most activities and important data reside on websites. Consequently, attempts to intrude into these websites have grown exponentially. Intrusion detection systems (IDS) for web attacks are one approach to protecting users, but these systems suffer from drawbacks such as low accuracy in ...

Full text

A density based clustering approach to distinguish between web robot and human requests to a web server

Today, the world's dependence on the Internet and the emergence of Web 2.0 applications are significantly increasing the demand for web robots that crawl sites to support services and technologies. Regardless of their advantages, robots may occupy bandwidth and reduce the performance of web servers. Despite a variety of research efforts, there is no accurate method for classifying huge data ...

Full text

The State of the Cross-domain Nation

By deploying a configuration that allows the creation of client-side, cross-domain HTTP requests, a Web application weakens the same-origin policy. This enables sophisticated browser-based interaction that is not possible in the standard model, but it may also lead to insecurities. In this paper, we briefly cover the technical background of client-side, cross-domain requests and explore the resul...

Full text

An On-Line Learning Statistical Model to Detect Malicious Web Requests

Detecting malicious connection attempts and attacks against web-based applications is one of many approaches to protect the World Wide Web and its users. In this paper, we present a generic method for detecting anomalous and potentially malicious web requests from the network’s point of view without prior knowledge or training data of the web-based application. The algorithm assumes that a legi...

Full text

Extracting Information from Web Content and Structure

The Web is a vast data repository. By mining this data efficiently, we can gain valuable knowledge. Unfortunately, in addition to useful content there are also many Web documents considered harmful (e.g. pornography, terrorism, illegal drugs). Web mining, which includes three main areas – content, structure, and usage mining – may help us detect and eliminate these sites. In this paper, we conce...

Full text



Publication date: 2014